Bayesian Neural Word Embedding
Recently, several works in the domain of natural language processing have presented successful methods for word embedding. Among them, Skip-Gram with negative sampling, also known as word2vec, advanced the state of the art on various linguistic tasks. In this paper, we propose a scalable Bayesian neural word embedding algorithm. The algorithm relies on a Variational Bayes solution for the Skip-Gram objective, and we provide a detailed step-by-step description. We present experimental results that demonstrate the performance of the proposed algorithm on word analogy and similarity tasks over six different datasets and show that it is competitive with the original Skip-Gram method.
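For context, the Skip-Gram with negative sampling objective that the Bayesian treatment builds on can be written as follows. The notation (input vectors u_w, context vectors v_c, observed pairs D, negatively sampled pairs D') follows the standard SGNS formulation from the word2vec literature, not necessarily this paper's specific derivation:

\[
\mathcal{L}_{\mathrm{SGNS}} \;=\; \sum_{(w,c)\in D} \log \sigma\!\left(u_w^{\top} v_c\right) \;+\; \sum_{(w,c)\in D'} \log \sigma\!\left(-u_w^{\top} v_c\right),
\qquad \sigma(x) = \frac{1}{1 + e^{-x}}.
\]

Roughly, a Variational Bayes treatment replaces the point estimates of u_w and v_c maximized above with approximate posterior distributions over the embedding vectors; the exact variational bound used here is detailed in the paper itself.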
Item2vec: Neural Item Embedding for Collaborative Filtering
Many Collaborative Filtering (CF) algorithms are item-based in the sense that they analyze item-item relations in order to produce item similarities. Recently, several works in the field of Natural Language Processing (NLP) have proposed learning latent representations of words using neural embedding algorithms. Among them, Skip-gram with Negative Sampling (SGNS), also known as word2vec, was shown to provide state-of-the-art results on various linguistic tasks. In this paper, we show that item-based CF can be cast in the same framework as neural word embedding. Inspired by SGNS, we describe a method we name Item2vec for item-based CF that produces embeddings for items in a latent space. The method is capable of inferring item-item relations even when user information is not available. We present experimental results that demonstrate the effectiveness of the Item2vec method and show that it is competitive with SVD.
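A minimal sketch of the Item2vec idea, using gensim's SGNS implementation rather than the authors' own code: each user's set of items plays the role of a "sentence", and a large context window approximates Item2vec's set-wide context (every item in a set is treated as context for every other). The item IDs and hyperparameter values below are illustrative assumptions, not values from the paper.

    # Item2vec-style item embeddings via gensim's Skip-gram with negative sampling.
    from gensim.models import Word2Vec

    # Hypothetical toy data: each inner list is one user's set of consumed items.
    item_sets = [
        ["item_a", "item_b", "item_c"],
        ["item_b", "item_c", "item_d"],
        ["item_a", "item_d"],
    ]

    model = Word2Vec(
        sentences=item_sets,
        vector_size=32,   # dimension of the item embeddings
        window=100,       # large window: the whole set acts as context
        min_count=1,      # keep all items, even rare ones, in this toy example
        sg=1,             # Skip-gram architecture
        negative=5,       # negative sampling (SGNS)
    )

    # Item-item similarities come straight from the learned latent space,
    # with no user information required at query time.
    print(model.wv.most_similar("item_b"))

Using a window wider than any item set is one common way to emulate the paper's removal of the spatial-context assumption; shuffling each set across epochs is another.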